451 research outputs found

    Evaluating model accuracy for model-based reasoning

    Described here is an approach to automatically assessing the accuracy of various components of a model. In this approach, actual data from the operation of a target system is used to drive statistical measures that evaluate the prediction accuracy of various portions of the model. We describe how these statistical measures of model accuracy can be used in model-based reasoning for monitoring and design. We then describe the application of these techniques to the monitoring and design of the water recovery system of the Environmental Control and Life Support System (ECLSS) of Space Station Freedom.
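    A minimal sketch of the kind of per-component accuracy scoring described above, assuming each model component exposes paired (predicted, observed) samples gathered from operational data; the component names, sample values, and threshold are illustrative, not taken from the paper:

```python
import math

# Hedged sketch: score each model component by RMSE and bias against
# operational data, and flag components whose error exceeds a threshold.
def rmse(pairs):
    """Root-mean-square error over (predicted, observed) pairs."""
    return math.sqrt(sum((p - o) ** 2 for p, o in pairs) / len(pairs))

def bias(pairs):
    """Mean signed error: positive means the model over-predicts."""
    return sum(p - o for p, o in pairs) / len(pairs)

def assess(components, rmse_threshold=1.0):
    """Return {name: (rmse, bias, trusted?)} for each model component."""
    report = {}
    for name, pairs in components.items():
        r, b = rmse(pairs), bias(pairs)
        report[name] = (r, b, r <= rmse_threshold)
    return report

# Hypothetical operational data for two components of a water-recovery model.
data = {
    "pump_flow":   [(10.1, 10.0), (9.8, 10.0), (10.2, 10.0)],
    "heater_temp": [(350.0, 340.0), (355.0, 341.0), (352.0, 339.0)],
}
report = assess(data, rmse_threshold=0.5)
```

    A monitoring loop could then restrict model-based diagnosis to the components still marked trusted.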

    A model-based reasoning approach to sensor placement for monitorability

    An approach is presented for evaluating sensor placements to maximize monitorability of the target system while minimizing the number of sensors. The approach uses a model of the monitored system to score potential sensor placements on the basis of four monitorability criteria. The scores can then be analyzed to produce a recommended sensor set. An example from our NASA application domain is used to illustrate our model-based approach to sensor placement.
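    The scoring idea could be sketched as follows; the two toy criteria here (coverage and economy) and their weights are placeholders, since the paper's four monitorability criteria are not reproduced in the abstract:

```python
# Hedged sketch: score candidate sensor sets against weighted criteria and
# recommend the best-scoring set, preferring fewer sensors on ties.
def score(placement, criteria, weights):
    """Weighted sum of criterion scores for one candidate sensor set."""
    return sum(weights[c] * f(placement) for c, f in criteria.items())

def recommend(candidates, criteria, weights):
    """Prefer the highest score; break ties with fewer sensors."""
    return max(candidates, key=lambda p: (score(p, criteria, weights), -len(p)))

# Toy criteria: coverage = fraction of 4 subsystems sensed; economy = fewer sensors.
criteria = {
    "coverage": lambda p: len({s[0] for s in p}) / 4.0,
    "economy":  lambda p: 1.0 / len(p),
}
weights = {"coverage": 3.0, "economy": 1.0}
candidates = [
    [("A", 1), ("B", 1)],                      # covers 2 subsystems, 2 sensors
    [("A", 1), ("B", 1), ("C", 1), ("D", 1)],  # covers 4 subsystems, 4 sensors
]
best = recommend(candidates, criteria, weights)
```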

    Almost Settling the Hardness of Noncommutative Determinant

    In this paper, we study the complexity of computing the determinant of a matrix over a non-commutative algebra. In particular, we ask: over which algebras is the determinant easier to compute than the permanent? Towards resolving this question, we show the following hardness and easiness results for noncommutative determinant computation.
    * [Hardness] Computing the determinant of an n × n matrix whose entries are themselves 2 × 2 matrices over a field is as hard as computing the permanent over the field. This extends the recent result of Arvind and Srinivasan, who proved a similar result which, however, required the entries to be of linear dimension.
    * [Easiness] The determinant of an n × n matrix whose entries are themselves d × d upper triangular matrices can be computed in poly(n^d) time.
    Combining the above with the decomposition theorem for finite-dimensional algebras (in particular, exploiting the simple structure of 2 × 2 matrix algebras), we can extend the above hardness and easiness statements to more general algebras as follows. Let A be a finite-dimensional algebra over a finite field with radical R(A).
    * [Hardness] If the quotient A/R(A) is non-commutative, then computing the determinant over the algebra A is as hard as computing the permanent.
    * [Easiness] If the quotient A/R(A) is commutative and, furthermore, R(A) has nilpotency index d (i.e., the smallest d such that R(A)^d = 0), then there exists a poly(n^d)-time algorithm that computes determinants over the algebra A.
    In particular, for any constant-dimensional algebra A over a finite field, since the nilpotency index of R(A) is at most a constant, we have the following dichotomy theorem: if A/R(A) is commutative, then efficient determinant computation is feasible; otherwise, the determinant is as hard as the permanent. Comment: 20 pages, 3 figures
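    The dichotomy hinges on whether the quotient A/R(A) commutes. A small sketch (not from the paper) illustrating both sides: full 2 × 2 matrix entries fail to commute, while for upper-triangular entries the diagonal part (the quotient by the strictly-upper-triangular radical) multiplies commutatively:

```python
# Hedged sketch: multiplication of 2x2 matrices is noncommutative in general,
# but the diagonals of upper-triangular matrices multiply commutatively,
# since (UV)[i][i] == U[i][i] * V[i][i] for triangular U, V.
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
noncommutative = matmul(A, B) != matmul(B, A)  # AB != BA for these two

U = [[2, 5], [0, 3]]
V = [[7, 1], [0, 4]]
# The diagonal of a product of upper-triangular matrices depends only on the
# diagonals, so it is insensitive to the order of multiplication.
diag_product_commutes = all(
    matmul(U, V)[i][i] == matmul(V, U)[i][i] for i in range(2)
)
```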

    Issues in knowledge representation to support maintainability: A case study in scientific data preparation

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues regarding maintainability are of prime importance in PIPE. This paper describes the PIPE system and describes how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.
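    The debugging use of a dependency model could be sketched as follows, assuming each plan step records the data elements it reads and writes; the step and element names are invented for illustration, and PIPE's actual representation is not reproduced:

```python
# Hedged sketch: trace a faulty output back through a dependency model to
# find the processing steps that could have contributed to it.
def build_producers(steps):
    """Map each data element to the step that produces it."""
    return {out: name for name, (ins, outs) in steps.items() for out in outs}

def suspects(steps, faulty_output):
    """All steps on which the faulty output transitively depends."""
    producers = build_producers(steps)
    found, frontier = set(), [faulty_output]
    while frontier:
        elem = frontier.pop()
        step = producers.get(elem)
        if step and step not in found:
            found.add(step)
            frontier.extend(steps[step][0])  # inputs of that step
    return found

# Toy plan: raw -> deglitch -> despike -> power spectral density.
steps = {
    "deglitch": (["raw"], ["clean"]),
    "despike":  (["clean"], ["smooth"]),
    "psd":      (["smooth"], ["spectrum"]),
    "average":  (["raw"], ["mean"]),  # unrelated branch, not implicated
}
blame = suspects(steps, "spectrum")
```

    Only the steps upstream of the bad value are returned, which is the attention-focusing behavior described above.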

    Structure-Substrate Binding Relationships of HIV-1 Reverse Transcriptase

    Human Immunodeficiency Virus, type 1 (HIV-1), is the causative agent of the Acquired Immunodeficiency Syndrome (AIDS). HIV-1 reverse transcriptase (RT), a heterodimer p66/p51, has been the major target for treatment of AIDS. The significance of the p51 subunit and the RNase H domain of p66 in terms of their influence on RNA-dependent DNA synthesis was investigated. Clones of the wildtype HIV-1 RT subunits, p66 and p51, and a recombinant C-terminal deletion mutant, p64 [Barr, P. J. (1987) Bio/Technology 5, 486-489], were employed to study the structure-substrate binding relationships of HIV-1 RT. The activity assays of RNA-dependent DNA synthesis on both poly(rA)(dT) and a random-base RNA template hybridized with a DNA oligomer showed that p51 significantly affects the enzyme activity. The increase in processivity conferred by p51 in the p66/p51 heterodimer was also demonstrated. These observations suggested that the integrity of p51 is important in subunit interactions for maintaining a favorable conformation of the enzyme for optimal function. C-terminal deletion in p66 was seen to decrease the processivity. The dissociation constant (Kd) for poly(rA)(dT) obtained by nitrocellulose binding assays suggested that the processivity of HIV-1 RT on poly(rA)(dT) correlated with the affinity for the substrate. The processivity of RT on RNA335-DNA20 was seen to be affected by the pause sites observed on the autoradiograms. The pauses of DNA synthesis tended to occur at positions of the template containing poly G-C sequences. The order of processivity observed on RNA335-DNA20 was p64/p64, p66/p66 < p64/p51 < p66/p51. The C-terminal deletion in p66 was shown to affect the ability to extend the DNA strand on the RNA template. In those non-wildtype forms of HIV-1 RT (p66/p66, p64/p64, and p64/p51), the affinity for primer-template seemed to be sensitive to the structure of the RNA template, as seen when comparing Kds between poly(rA)(dT) and RNA335-DNA20. The wildtype enzyme, p66/p51, appeared to have a similar affinity for both substrates.
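    The reported link between Kd and processivity rests on the standard single-site binding relation, fraction bound = [S] / (Kd + [S]), which underlies nitrocellulose binding assays. A small sketch with purely illustrative numbers (not the paper's measured values):

```python
# Hedged sketch: the single-site binding isotherm. A lower Kd (higher
# affinity) yields a larger bound fraction at the same substrate
# concentration, consistent with the affinity-processivity correlation above.
def fraction_bound(substrate_conc, kd):
    """Fraction of enzyme bound to substrate at equilibrium."""
    return substrate_conc / (kd + substrate_conc)

tight = fraction_bound(10.0, kd=2.0)   # hypothetical high-affinity form
loose = fraction_bound(10.0, kd=50.0)  # hypothetical low-affinity form
```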

    Learning to integrate reactivity and deliberation in uncertain planning and scheduling problems

    This paper describes an approach to planning and scheduling in uncertain domains. In this approach, a system divides a task, on a goal-by-goal basis, into reactive and deliberative components. Initially, a task is handled entirely reactively. When failures occur, the system changes the reactive/deliberative goal division by moving goals into the deliberative component. Because our approach attempts to minimize the number of deliberative goals, we call it Minimal Deliberation (MD). Because MD allows goals to be treated reactively, it gains some of the advantages of reactive systems: computational efficiency, the ability to deal with noise and non-deterministic effects, and the ability to take advantage of unforeseen opportunities. However, because MD can fall back upon deliberation, it can also provide some of the guarantees of classical planning, such as the ability to deal with complex goal interactions. This paper describes the Minimal Deliberation approach to integrating reactivity and deliberation and describes an ongoing application of the approach to an uncertain planning and scheduling domain.
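    The control loop described above can be sketched directly; the goal names and the stand-in `try_reactively` / `deliberate` hooks are illustrative assumptions, not MD's actual interfaces:

```python
# Hedged sketch of the Minimal Deliberation division: every goal starts
# reactive, and a goal is promoted to the deliberative set only when its
# reactive handling fails, keeping the deliberative set minimal.
def minimal_deliberation(goals, try_reactively, deliberate):
    reactive, deliberative = set(goals), set()
    for goal in goals:
        if goal in reactive and not try_reactively(goal):
            reactive.discard(goal)
            deliberative.add(goal)   # promote only on failure
    deliberate(deliberative)         # plan for the hard goals together
    return reactive, deliberative

# Toy domain: one goal has interactions that defeat reactive handling.
fails = {"g2"}
planned = []
reactive, deliberative = minimal_deliberation(
    ["g1", "g2", "g3"],
    try_reactively=lambda g: g not in fails,
    deliberate=planned.extend,
)
```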

    Intelligent assistance in scientific data preparation

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks.
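    The runtime-estimation use of the dependency model could be sketched like this: estimate a plan's cost as the summed cost of only those fittings the requested output actually needs. The fitting names and per-fitting costs are invented for illustration:

```python
# Hedged sketch: walk the dependency model backward from the requested
# output and total the costs of the fittings that must run.
def estimate_runtime(steps, costs, target):
    """steps: {name: (inputs, output)}; returns total cost to produce target."""
    producer = {out: name for name, (ins, out) in steps.items()}
    needed, frontier = set(), [target]
    while frontier:
        elem = frontier.pop()
        step = producer.get(elem)
        if step and step not in needed:
            needed.add(step)
            frontier.extend(steps[step][0])
    return sum(costs[s] for s in needed)

steps = {
    "decompress": ([], "raw"),
    "deglitch":   (["raw"], "clean"),
    "psd":        (["clean"], "spectrum"),
    "average":    (["raw"], "mean"),  # not needed to produce "spectrum"
}
costs = {"decompress": 5.0, "deglitch": 2.0, "psd": 8.0, "average": 1.0}
eta = estimate_runtime(steps, costs, "spectrum")
```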

    Automated synthesis of image processing procedures using AI planning techniques

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985; Pemberthy & Weld, 1992; Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.
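    The goal-directed derivation of unspecified steps could be sketched as simple backward chaining over operator models; the operators, preconditions, and effects below are invented placeholders, not MVP's actual VICAR program models:

```python
# Hedged sketch: chain backward from requested image processing goals,
# recursively achieving each operator's preconditions, to emit an ordered
# executable program of processing steps.
def plan(goals, operators, initial):
    """Simple backward chaining; returns operator names in execution order."""
    have, program = set(initial), []
    def achieve(goal):
        if goal in have:
            return
        for name, (pre, effect) in operators.items():
            if effect == goal:
                for p in pre:
                    achieve(p)
                program.append(name)
                have.add(goal)
                return
        raise ValueError(f"no operator achieves {goal}")
    for g in goals:
        achieve(g)
    return program

operators = {  # hypothetical correction steps: (preconditions, effect)
    "radiometric-correct": (["raw-image"], "calibrated"),
    "geometric-correct":   (["calibrated"], "registered"),
    "enhance-contrast":    (["registered"], "displayable"),
}
program = plan(["displayable"], operators, initial=["raw-image"])
```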

    Attention focussing and anomaly detection in real-time systems monitoring

    In real-time monitoring situations, more information is not necessarily better. When faced with complex emergency situations, operators can experience information overload that compromises their ability to react quickly and correctly. We describe an approach to focusing operator attention in real-time systems monitoring based on a set of empirical and model-based measures for determining the relative importance of sensor data.
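    One way to combine empirical and model-based measures, sketched with invented sensor names, readings, and weights (the paper's actual measures are not reproduced): score each reading by its deviation from nominal scaled by a model-based criticality weight, then surface only the top few.

```python
# Hedged sketch of attention focusing: rank sensor data by an importance
# score combining an empirical measure (relative deviation from nominal)
# with a model-based weight (criticality of the sensed quantity).
def importance(reading, nominal, criticality):
    deviation = abs(reading - nominal) / max(abs(nominal), 1e-9)
    return deviation * criticality

def focus(sensors, top=2):
    """Return the `top` sensor names the operator should look at first."""
    ranked = sorted(sensors, key=lambda s: importance(*sensors[s]), reverse=True)
    return ranked[:top]

sensors = {  # name: (reading, nominal, model-based criticality weight)
    "cabin_pressure": (13.9, 14.7, 10.0),
    "fan_rpm":        (2900, 3000, 1.0),
    "coolant_temp":   (45.0, 20.0, 5.0),
}
watch = focus(sensors, top=2)
```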